
# Cross-modal distillation

## TinyCLIP ViT-8M/16 Text-3M (YFCC15M)
License: MIT · Category: Text-to-Image Transformers · Author: wkcn · Downloads: 56.32k · Likes: 9

TinyCLIP is a cross-modal distillation method for large-scale language-image pre-trained models that balances speed and accuracy through affinity mimicking and weight inheritance.
## TinyCLIP ViT-61M/32 Text-29M (LAION400M)
License: MIT · Category: Text-to-Image Transformers · Author: wkcn · Downloads: 950 · Likes: 1

TinyCLIP is a cross-modal distillation method for large-scale vision-language pre-training models that balances speed and accuracy through affinity mimicking and weight inheritance.
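
Because TinyCLIP distills into the standard CLIP architecture, the checkpoints above can usually be loaded with the regular Hugging Face `transformers` CLIP classes. The snippet below is a minimal sketch, assuming the `wkcn/TinyCLIP-ViT-8M-16-Text-3M-YFCC15M` repository id is CLIP-compatible and that a local `example.jpg` exists; adjust both to your setup.

```python
# Minimal sketch: zero-shot image-text matching with a TinyCLIP checkpoint.
# Assumes the wkcn/TinyCLIP-ViT-8M-16-Text-3M-YFCC15M repo is CLIP-compatible
# and that "example.jpg" exists locally; adjust both to your environment.
import torch
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

model_id = "wkcn/TinyCLIP-ViT-8M-16-Text-3M-YFCC15M"  # assumed Hugging Face repo id
model = CLIPModel.from_pretrained(model_id)
processor = CLIPProcessor.from_pretrained(model_id)

image = Image.open("example.jpg")
texts = ["a photo of a cat", "a photo of a dog"]

# Encode the image and the candidate captions jointly.
inputs = processor(text=texts, images=image, return_tensors="pt", padding=True)

with torch.no_grad():
    outputs = model(**inputs)

# Image-text similarity logits -> probabilities over the candidate captions.
probs = outputs.logits_per_image.softmax(dim=-1)
for text, p in zip(texts, probs[0].tolist()):
    print(f"{text}: {p:.3f}")
```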